
    Brightness temperature constraints from interferometric visibilities

    The brightness temperature is an effective parameter that describes the physical properties of emitting material in astrophysical objects. It is commonly determined by imaging and modeling the structure of the emitting region and estimating its flux density and angular size. Reliable approaches for visibility-based estimates of the brightness temperature are needed for interferometric experiments in which poor coverage of spatial frequencies prevents successful imaging of the source structure, for example in interferometric measurements made at millimeter wavelengths or with orbiting antennas. Such approaches can be developed by analyzing the relations between the brightness temperature and the visibility amplitude and its r.m.s. error. A method is introduced for directly calculating lower and upper limits on the brightness temperature from visibility measurements. The visibility-based brightness temperature estimates are shown to agree well with the image-based estimates obtained in the 2 cm MOJAVE survey and the 3 mm CMVA survey, with good agreement achieved for interferometric measurements at spatial frequencies exceeding ≈2×10^8. The method provides an essential tool for constraining the brightness temperature in all interferometric experiments with poor imaging capability. (Comment: Accepted for publication in Astronomy and Astrophysics; 10 pages; 9 figures)
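    The relations summarized above can be made concrete with the standard circular-Gaussian component relations. The Python sketch below is illustrative only: it computes the image-based brightness temperature from a flux density and FWHM angular size, and a schematic visibility-based lower limit obtained by minimising the Gaussian brightness temperature over the component size at a fixed measured visibility amplitude. The paper's actual limits also fold in the visibility r.m.s. error, so the function names and example numbers here are assumptions for illustration.

    # Sketch only: standard circular-Gaussian brightness-temperature relations
    # used to mimic the image-based and visibility-based estimates described above.
    import numpy as np

    K_B = 1.380649e-23                         # Boltzmann constant [J/K]
    C = 2.99792458e8                           # speed of light [m/s]
    JY = 1e-26                                 # 1 Jy in W m^-2 Hz^-1
    MAS = np.pi / (180.0 * 3600.0 * 1000.0)    # 1 milliarcsecond in radians

    def tb_image(flux_jy, theta_mas, freq_ghz):
        """Image-based T_b of a circular Gaussian with flux density flux_jy [Jy]
        and FWHM size theta_mas [mas], observed at freq_ghz [GHz]."""
        nu = freq_ghz * 1e9
        return 2.0 * np.log(2.0) * C**2 * flux_jy * JY / (
            np.pi * K_B * nu**2 * (theta_mas * MAS)**2)

    def tb_min_visibility(vis_amp_jy, baseline_m):
        """Schematic lower limit on T_b from a single visibility amplitude
        vis_amp_jy [Jy] measured on a physical baseline of length baseline_m [m],
        obtained by minimising the Gaussian T_b over the component size."""
        return np.pi * np.e * baseline_m**2 * vis_amp_jy * JY / (2.0 * K_B)

    # Example (hypothetical numbers): a 1 Jy, 0.1 mas component at 15 GHz (2 cm),
    # and 0.5 Jy of correlated flux on an 8000 km space-ground baseline.
    print(f"T_b (image-based)    = {tb_image(1.0, 0.1, 15.0):.2e} K")
    print(f"T_b,min (visibility) = {tb_min_visibility(0.5, 8.0e6):.2e} K")

    For the hypothetical 2 cm observation on an 8000 km baseline above, the spatial frequency is B/λ ≈ 4×10^8, i.e. within the regime above ≈2×10^8 where the abstract reports good agreement between the two estimates.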

    Multi-scale and Multi-directional VLBI Imaging with CLEAN

    Very long baseline interferometry (VLBI) is a radio-astronomical technique in which the correlated signal from various baselines is combined into an image of highest angular resolution. Due to the sparsity of the measurements, this imaging procedure constitutes an ill-posed inverse problem. For decades the CLEAN algorithm has been the standard choice in VLBI studies, despite serious disadvantages and pathologies that are challenged by the requirements of modern frontline VLBI applications. We develop a novel multi-scale CLEAN deconvolution method (DoB-CLEAN) based on continuous wavelet transforms that addresses several pathologies of CLEAN imaging. We benchmark this novel algorithm against CLEAN reconstructions on synthetic data and reanalyze RadioAstron observations of BL Lac with DoB-CLEAN. DoB-CLEAN approximates the image by multi-scalar and multi-directional wavelet dictionaries. Two different dictionaries are used: first, a dictionary of differences of elliptical spherical Bessel functions, fitted to the uv-coverage of the observation, which is used to sparsely represent the features in the dirty image; second, a dictionary of differences of elliptical Gaussian wavelets, which is well suited to represent relevant image features cleanly. The deconvolution is performed by switching between the two dictionaries. DoB-CLEAN achieves super-resolution compared to CLEAN and remedies the spurious regularization properties of CLEAN. In contrast to CLEAN, the representation by basis functions has a physical meaning, so the computed deconvolved image still fits the observed visibilities, as opposed to CLEAN. State-of-the-art multi-scalar imaging approaches seem to outperform single-scalar standard approaches in VLBI and are well suited to maximize the extraction of information in ongoing frontline VLBI applications. (Comment: Accepted for publication in A&A)
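    To illustrate the basic idea of multi-scale deconvolution with a wavelet-like dictionary, the toy Python sketch below runs a greedy CLEAN-style loop: each iteration picks the scale and position with the strongest response in the residual image and subtracts the corresponding beam-convolved basis function. It uses a simple difference-of-Gaussians dictionary and image-space convolutions only; DoB-CLEAN's actual difference-of-Bessel and difference-of-Gaussian dictionaries are fitted to the uv-coverage and the algorithm switches between them, so all function names, scales, and parameters below are hypothetical simplifications, not the authors' implementation.

    # Toy multi-scale CLEAN loop (illustrative only; not DoB-CLEAN itself).
    import numpy as np
    from scipy.ndimage import gaussian_filter
    from scipy.signal import fftconvolve

    def dog_kernel(size, s_in, s_out):
        """Difference-of-Gaussians basis function on a size x size grid, peak-normalised."""
        delta = np.zeros((size, size))
        delta[size // 2, size // 2] = 1.0
        kern = gaussian_filter(delta, s_in) - gaussian_filter(delta, s_out)
        return kern / kern.max()

    def multiscale_clean(dirty_image, dirty_beam, scales=((1, 2), (2, 4), (4, 8)),
                         gain=0.1, niter=200, threshold=1e-3):
        """Greedy deconvolution: repeatedly subtract the dirty-beam response of the
        best-matching basis function and accumulate the component in the model.
        Assumes dirty_image and dirty_beam are square arrays of the same shape."""
        residual = dirty_image.astype(float).copy()
        model = np.zeros_like(residual)
        npix = residual.shape[0]
        kernels = [dog_kernel(npix, a, b) for a, b in scales]
        beam_kernels = [fftconvolve(k, dirty_beam, mode="same") for k in kernels]
        for _ in range(niter):
            # Cross-correlate the residual with each basis function.
            responses = [fftconvolve(residual, k[::-1, ::-1], mode="same") for k in kernels]
            best = int(np.argmax([np.abs(r).max() for r in responses]))
            if np.abs(responses[best]).max() < threshold:
                break
            iy, ix = np.unravel_index(np.abs(responses[best]).argmax(), residual.shape)
            amp = gain * responses[best][iy, ix]
            # Shift the (beam-convolved) basis function to the peak position
            # (wrap-around at the edges, acceptable for a toy example).
            shift = (iy - npix // 2, ix - npix // 2)
            residual -= amp * np.roll(beam_kernels[best], shift, axis=(0, 1))
            model += amp * np.roll(kernels[best], shift, axis=(0, 1))
        return model, residual

    # Usage example: recover two point sources blurred by a Gaussian dirty beam.
    beam = np.zeros((128, 128)); beam[64, 64] = 1.0
    beam = gaussian_filter(beam, 2.0); beam /= beam.max()
    sky = np.zeros((128, 128)); sky[60, 60] = 1.0; sky[70, 75] = 0.5
    dirty = fftconvolve(sky, beam, mode="same")
    model, resid = multiscale_clean(dirty, beam)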